Event-Based Stereo Visual Odometry

Authors

Abstract

Event-based cameras are bioinspired vision sensors whose pixels work independently from each other and respond asynchronously to brightness changes, with microsecond resolution. Their advantages make it possible to tackle challenging scenarios in robotics, such as high-speed and high dynamic range scenes. We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig. Our system follows a parallel tracking-and-mapping approach, where novel solutions to each subproblem (three-dimensional (3-D) reconstruction and pose estimation) are developed with two objectives in mind: being principled and efficient, for real-time operation on commodity hardware. To this end, we seek to maximize the spatio-temporal consistency of the event data while using a simple and efficient representation. Specifically, the mapping module builds a semidense 3-D map of the scene by fusing depth estimates from multiple viewpoints (obtained by spatio-temporal consistency) in a probabilistic fashion. The tracking module recovers the pose of the rig by solving a registration problem that naturally arises due to the chosen event representation. Experiments on publicly available datasets and on our own recordings demonstrate the versatility of the proposed method in natural scenes with general 6-DoF motion. The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low light and high dynamic range, while running in real time on a standard CPU. We release the software and dataset under an open source license to foster research on the emerging topic of event-based simultaneous localization and mapping.
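
The mapping step described above, fusing depth estimates from multiple viewpoints in a probabilistic fashion, is commonly realized as a product of Gaussian inverse-depth hypotheses. The following is a minimal sketch of that generic fusion rule only; the function name, the inverse-depth parameterization, and the example numbers are illustrative assumptions and are not taken from the paper's implementation.

```python
import numpy as np

def fuse_inverse_depth(estimates):
    """Fuse Gaussian inverse-depth hypotheses (mean, variance) of one map point:
    precisions add, and the fused mean is the precision-weighted mean."""
    means = np.array([m for m, _ in estimates])
    variances = np.array([v for _, v in estimates])
    precisions = 1.0 / variances
    fused_var = 1.0 / precisions.sum()
    fused_mean = fused_var * (precisions * means).sum()
    return float(fused_mean), float(fused_var)

# Example: three hypothetical inverse-depth observations (in 1/m) of one point
mu, var = fuse_inverse_depth([(0.50, 0.010), (0.55, 0.020), (0.48, 0.005)])
print(f"fused inverse depth {mu:.3f} 1/m, variance {var:.4f}")
```

Under such a rule, low-variance (more certain) observations dominate the fused estimate, which is what allows a semidense map to be refined as more viewpoints become available.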


Similar articles

Direct Stereo Visual Odometry based on Lines

We propose a novel stereo visual odometry approach, which is especially suited for poorly textured environments. We introduce a novel, fast line segment detector and matcher, which detects vertical lines supported by an IMU. The patches around lines are then used to directly estimate the pose of consecutive cameras by minimizing the photometric error. Our algorithm outperforms state-of-the-art ...
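
The photometric alignment mentioned above compares the intensities of patch pixels around a line in a reference image with the intensities found at their corresponding locations in the current image. A rough, hypothetical sketch of such a residual (assuming the patch points have already been warped into the current image; names and data are illustrative, not the authors' code):

```python
import numpy as np

def photometric_residual(ref_intensities, cur_image, uv):
    """Difference between the stored intensities of reference patch points and
    the intensities sampled (nearest neighbor) at their locations uv in the
    current image."""
    rows = uv[:, 1].round().astype(int)
    cols = uv[:, 0].round().astype(int)
    sampled = cur_image[rows, cols].astype(float)
    return ref_intensities.astype(float) - sampled

# Example: a 5-pixel patch on a synthetic intensity ramp, perfectly aligned
img = np.tile(np.arange(10, dtype=np.uint8), (10, 1))   # intensity grows along x
uv = np.array([[2.0, 3.0], [3.0, 3.0], [4.0, 3.0], [3.0, 2.0], [3.0, 4.0]])
ref = np.array([2, 3, 4, 3, 3], dtype=np.uint8)
print(photometric_residual(ref, img, uv))                # -> all zeros
```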

High Altitude Stereo Visual Odometry

Stereo visual odometry has received little investigation in high altitude applications due to the generally poor performance of rigid stereo rigs at extremely small baseline-to-depth ratios. Without additional sensing, metric scale is considered lost and odometry is seen as effective only for monocular perspectives. This paper presents a novel modification to stereo based visual odometry that a...
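
The difficulty at small baseline-to-depth ratios follows from the standard error propagation for a rectified stereo pair, where depth uncertainty grows quadratically with depth and inversely with baseline. A small illustrative calculation (the rig parameters are hypothetical, not from the paper):

```python
def depth_std(depth_m, baseline_m, focal_px, disparity_std_px=0.5):
    """First-order propagation of disparity noise to depth for a rectified
    stereo pair: Z = f * b / d, hence sigma_Z ~= Z**2 * sigma_d / (f * b)."""
    return depth_m ** 2 * disparity_std_px / (focal_px * baseline_m)

# Hypothetical rig: 0.3 m baseline, 1000 px focal length, half-pixel matching noise
print(depth_std(300.0, 0.3, 1000.0))   # ~150 m of uncertainty at 300 m range
```

At a 300 m range with a 0.3 m baseline, the depth uncertainty in this example is on the order of half the range, which illustrates why metric scale is usually considered unreliable in this regime.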

Fast Stereo-based Visual Odometry for Rover Navigation

The object of visual odometry is the computation of the path of a rover from onboard passive vision data only. The approach presented here relies on the accumulation of ego-motion estimates obtained by stereo vision and bundle adjustment of tracked feature points. We also propose a new feature detector/descriptor, which is a simplified and faster form of other well known descriptors (SURF). For...
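
The accumulation of ego-motion estimates mentioned above amounts to composing frame-to-frame rigid-body transforms into a global trajectory, which bundle adjustment then refines. A minimal sketch with hypothetical values:

```python
import numpy as np

def accumulate_poses(relative_poses):
    """Chain frame-to-frame 4x4 homogeneous transforms T_{k-1,k} into global
    poses T_{0,k} = T_{0,1} @ ... @ T_{k-1,k}, starting from the identity."""
    trajectory = [np.eye(4)]
    for T in relative_poses:
        trajectory.append(trajectory[-1] @ T)
    return trajectory

# Example: two hypothetical 0.1 m forward steps along the camera z-axis
step = np.eye(4)
step[2, 3] = 0.1
print(accumulate_poses([step, step])[-1][:3, 3])   # -> [0.  0.  0.2]
```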

Noise Models in Feature-based Stereo Visual Odometry

Feature-based visual structure and motion reconstruction pipelines, common in visual odometry and large-scale reconstruction from photos, use the location of corresponding features in different images to determine the 3D structure of the scene, as well as the camera parameters associated with each image. The noise model, which defines the likelihood of the location of each feature in each image...
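
Under a Gaussian noise model, the likelihood assigned to each observed feature location translates into a Mahalanobis-weighted reprojection residual in the estimation problem. A minimal, hypothetical sketch of that weighting (not tied to any particular pipeline):

```python
import numpy as np

def weighted_reprojection_cost(observed, predicted, cov):
    """Squared Mahalanobis distance between an observed feature location and
    its predicted reprojection, under a 2x2 Gaussian noise model."""
    r = observed - predicted                       # 2-D residual in pixels
    return float(r @ np.linalg.inv(cov) @ r)

# Example with a hypothetical anisotropic detector noise (larger along x)
cov = np.diag([2.0 ** 2, 0.5 ** 2])                # pixel variances
obs = np.array([101.0, 50.2])
pred = np.array([100.0, 50.0])
print(weighted_reprojection_cost(obs, pred, cov))  # ~0.41
```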

Features detection for Stereo Visual Odometry

Estimating its ego-motion is one of the most important capabilities for an autonomous mobile platform. Without reliable ego-motion estimation no long-term navigation is possible. Besides odometry, inertial sensors, DGPS, laser range finders and so on, vision based algorithms can contribute a lot of information. Stereo odometry is a vision based motion estimation algorithm that estimates the ego...


Journal

Journal title: IEEE Transactions on Robotics

Year: 2021

ISSN: 1552-3098, 1941-0468, 1546-1904

DOI: https://doi.org/10.1109/tro.2021.3062252